Application of frames in Chebyshev and conjugate gradient methods

Authors

  • E. Afroomand, Department of Mathematics, Vali-e-Asr University of Rafsanjan, Rafsanjan, Iran.
  • H. Jamali, Department of Mathematics, Vali-e-Asr University of Rafsanjan, Rafsanjan, Iran.
Abstract:

Given a frame of a separable Hilbert space $H$, we present some iterative methods for solving an operator equation $Lu=f$, where $L$ is a bounded, invertible and symmetric operator on $H$. We present algorithms based on the knowledge of the frame bounds, the Chebyshev method and the conjugate gradient method, in order to give approximate solutions to the problem. We then investigate their convergence and optimality.
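To make the role of the frame bounds concrete, the sketch below shows the classical Chebyshev semi-iteration for $Lu=f$, with the spectral interval taken to be the frame bounds $A \le B$ (as is valid, e.g., when $L$ is the frame operator). This is a minimal illustration under assumed names and a finite-dimensional stand-in, not the paper's own algorithms.

```python
import numpy as np

def chebyshev_iteration(L, f, lam_min, lam_max, n_steps=50, u0=None):
    """Chebyshev semi-iteration for L u = f, where L is symmetric positive
    with spectrum contained in [lam_min, lam_max].  When L is the frame
    operator of a frame with bounds A <= B, one may take lam_min = A and
    lam_max = B; this is how knowledge of the frame bounds enters.
    Illustrative sketch; interface and names are assumptions."""
    assert lam_max > lam_min > 0
    theta = 0.5 * (lam_max + lam_min)   # centre of the spectral interval
    delta = 0.5 * (lam_max - lam_min)   # half-width of the interval
    sigma = theta / delta
    u = np.zeros_like(f) if u0 is None else u0.copy()
    r = f - L @ u                       # initial residual
    d = r / theta                       # first correction (a Richardson step)
    rho = 1.0 / sigma
    for _ in range(n_steps):
        u = u + d
        r = r - L @ d
        rho_next = 1.0 / (2.0 * sigma - rho)
        d = rho_next * rho * d + (2.0 * rho_next / delta) * r
        rho = rho_next
    return u

# Finite-dimensional stand-in: an SPD matrix plays the role of L, and
# its extreme eigenvalues play the role of the frame bounds A and B.
rng = np.random.default_rng(0)
M = rng.standard_normal((40, 40))
L = M.T @ M + 40.0 * np.eye(40)
A, B = np.linalg.eigvalsh(L)[[0, -1]]
u_true = rng.standard_normal(40)
u = chebyshev_iteration(L, L @ u_true, A, B, n_steps=80)
print(np.linalg.norm(u - u_true))      # error should be near machine precision
```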


Similar articles

Richardson and Chebyshev Iterative Methods by Using G-frames

In this paper, we design some iterative schemes for solving the operator equation $Lu=f$, where $L: H \rightarrow H$ is a bounded, invertible and self-adjoint operator on a separable Hilbert space $H$. In this regard, the Richardson and Chebyshev iterative methods are two outstanding as well as long-standing ones. They can be implemented in different ways via different concepts. In this paper...
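For context, the Richardson scheme mentioned above is, in its simplest damped form, $u_{k+1} = u_k + \omega (f - Lu_k)$, which converges for positive self-adjoint $L$ whenever $0 < \omega < 2/\lambda_{\max}$. A minimal sketch, with assumed names and the classical optimal damping, not taken from the cited paper:

```python
import numpy as np

def richardson_iteration(L, f, lam_min, lam_max, n_steps=200, u0=None):
    """Damped Richardson iteration u_{k+1} = u_k + omega * (f - L u_k)
    for a positive self-adjoint L with spectrum in [lam_min, lam_max].
    The classical optimal constant damping is omega = 2/(lam_min + lam_max);
    for a frame operator, the frame bounds can serve as lam_min and lam_max.
    Illustrative sketch only."""
    omega = 2.0 / (lam_min + lam_max)   # optimal constant step size
    u = np.zeros_like(f) if u0 is None else u0.copy()
    for _ in range(n_steps):
        u = u + omega * (f - L @ u)
    return u
```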

Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property

Using the search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are done on a set of CUTEr unconstrained opti...
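For orientation, the classical Hestenes-Stiefel and Polak-Ribiere-Polyak update parameters and the sufficient descent condition look as follows. This is a generic sketch with assumed function names, not the modified three-term scheme of the cited paper, which enforces sufficient descent by construction rather than by the restart fallback used here.

```python
import numpy as np

def beta_hs(g_new, g_old, d):
    """Hestenes-Stiefel parameter beta = g_{k+1}^T y_k / (d_k^T y_k),
    where y_k = g_{k+1} - g_k is the gradient change."""
    y = g_new - g_old
    return float(g_new @ y) / float(d @ y)

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak parameter beta = g_{k+1}^T y_k / ||g_k||^2."""
    y = g_new - g_old
    return float(g_new @ y) / float(g_old @ g_old)

def next_direction(g_new, g_old, d, c=1e-4):
    """Direction update d_{k+1} = -g_{k+1} + beta * d_k.  The sufficient
    descent condition requires g_{k+1}^T d_{k+1} <= -c * ||g_{k+1}||^2 for
    some fixed c > 0; plain HS/PRP do not guarantee it, so this sketch
    simply falls back to steepest descent when the check fails."""
    d_new = -g_new + beta_prp(g_new, g_old) * d
    if float(g_new @ d_new) > -c * float(g_new @ g_new):
        d_new = -g_new  # restart with the steepest descent direction
    return d_new
```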

Towards Stochastic Conjugate Gradient Methods

The method of conjugate gradients provides a very effective way to optimize large, deterministic systems by gradient descent. In its standard form, however, it is not amenable to stochastic approximation of the gradient. Here we explore a number of ways to adopt ideas from conjugate gradient in the stochastic setting, using fast Hessian-vector products to obtain curvature information cheaply. I...
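The fast Hessian-vector products mentioned above can be realized, for example, by a finite-difference approximation costing only two gradient evaluations; exact products via Pearlmutter's trick are another common route. The helper below is a hypothetical illustration, not the cited paper's implementation.

```python
import numpy as np

def hessian_vector_product(grad, x, v, eps=1e-6):
    """Finite-difference approximation of the Hessian-vector product
    H(x) v ~= (grad(x + eps*v) - grad(x)) / eps, avoiding ever forming
    the Hessian explicitly.  Hypothetical helper for illustration."""
    return (grad(x + eps * v) - grad(x)) / eps

# Check on a quadratic f(x) = 0.5 x^T Q x, whose Hessian is exactly Q.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: Q @ x
v = np.array([1.0, -1.0])
print(hessian_vector_product(grad, np.zeros(2), v))  # approximately Q @ v
```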

Conjugate Gradient Methods in Training Neural Networks

Training of artificial neural networks is normally a time-consuming task due to the iterative search imposed by the implicit nonlinearity of the network behavior. To tackle the supervised learning of multilayer feed-forward neural networks, the backpropagation algorithm has been proven to be one of the most successful neural network algorithms. Although backpropagation training has proved to be effi...

Journal information

Volume 43, Issue 5

Pages 1265-1279

Publication date: 2017-10-31
